Performance Analysis of Classifier Ensembles: Neural Networks Versus Nearest Neighbor Rule

Authors

  • Rosa Maria Valdovinos
  • José Salvador Sánchez
Abstract

Here we compare the performance (predictive accuracy and processing time) of several neural network ensembles with that of nearest neighbor classifier ensembles. For the connectionist models, the multilayer perceptron and the modular neural network are employed. Experiments on several real-problem data sets show a certain superiority of the nearest-neighbor-based schemes, in terms of both accuracy and computing time. Comparing the neural network ensembles with each other, the multilayer perceptron behaves better than the modular networks.
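
As a rough illustration of the kind of comparison the abstract describes (not the authors' actual experimental setup), the sketch below builds a bagged multilayer perceptron ensemble and a bagged 1-NN ensemble with scikit-learn and reports accuracy and elapsed time; the data set, ensemble size, and network architecture are assumptions made only for this example.

```python
import time

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Assumed benchmark data set; the paper uses several real-problem data sets.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

ensembles = {
    "MLP ensemble ": BaggingClassifier(
        MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0),
        n_estimators=10, random_state=0),
    "1-NN ensemble": BaggingClassifier(
        KNeighborsClassifier(n_neighbors=1),
        n_estimators=10, random_state=0),
}

for name, clf in ensembles.items():
    start = time.perf_counter()
    clf.fit(X_train, y_train)        # training time is part of the cost
    y_pred = clf.predict(X_test)     # so is prediction time, especially for kNN
    elapsed = time.perf_counter() - start
    print(f"{name}: accuracy={accuracy_score(y_test, y_pred):.3f}, "
          f"time={elapsed:.2f}s")
```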


Related articles

Classifier ensembles for image identification using multi-objective Pareto features

In this paper we propose classifier ensembles that use multiple Pareto image features for invariant image identification. Different from traditional ensembles that focus on enhancing diversity by generating diverse base classifiers, the proposed method takes advantage of the diversity inherent in the Pareto features extracted using a multi-objective evolutionary Trace Transform algorithm. Two v...
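
The Trace Transform and the multi-objective feature search cannot be reproduced in a few lines, but the underlying ensemble idea, one base classifier per feature representation combined by majority vote, can be sketched roughly as follows; the feature grouping, base learner, and data set are assumptions for illustration only, not the method of the paper.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1)

# Stand-in for the Pareto feature sets: disjoint slices of the feature vector.
feature_groups = np.array_split(np.arange(X.shape[1]), 4)

# One base classifier per feature representation.
members = [(cols, DecisionTreeClassifier(random_state=0).fit(X_train[:, cols], y_train))
           for cols in feature_groups]

# Combine the member predictions by majority vote.
votes = np.stack([clf.predict(X_test[:, cols]) for cols, clf in members])
y_pred = np.array([np.bincount(col).argmax() for col in votes.T])
print("ensemble accuracy:", (y_pred == y_test).mean())
```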


Clustering-based k-nearest neighbor classification for large-scale data with neural codes representation

While it stands as one of the most widely used and successful supervised classification algorithms, the k-Nearest Neighbor (kNN) classifier generally suffers from poor efficiency because it is an instance-based method. In this sense, Approximated Similarity Search (ASS) stands as a possible alternative to improve those efficiency issues, at the expense of typically lowering the performance of t...
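
As a hedged sketch of the general clustering-then-search idea (partition the training set, then run kNN only inside the cluster nearest to the query), the following uses k-means on raw features; the number of clusters, the value of k, and the data set are assumptions, and the paper's neural-code representation and ASS machinery are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Partition the training set so each query only searches one cluster.
kmeans = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X_train)

def cluster_knn_predict(x, k=3):
    # Restrict the neighbour search to the cluster whose centroid is closest.
    members = np.where(kmeans.labels_ == kmeans.predict(x[None, :])[0])[0]
    dists = np.linalg.norm(X_train[members] - x, axis=1)
    nearest = members[np.argsort(dists)[:min(k, len(members))]]
    return np.bincount(y_train[nearest]).argmax()

y_pred = np.array([cluster_knn_predict(x) for x in X_test])
print("clustered kNN accuracy:", (y_pred == y_test).mean())
```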


Prototype reduction techniques: A comparison among different approaches

The two main drawbacks of nearest neighbor based classifiers are high CPU cost when the number of samples in the training set is large, and performance that is extremely sensitive to outliers. Several attempts to overcome these drawbacks have been proposed in the pattern recognition field, aimed at selecting/generating an adequate subset of prototypes from the training set. The problem addressed in th...
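
One classic selection scheme of this kind is Hart's condensed nearest neighbour rule; a minimal sketch is given below (the random presentation order and the synthetic two-class data are assumptions, and the paper itself compares several different reduction approaches).

```python
import numpy as np

def condensed_nn(X, y, seed=0):
    """Hart's condensed nearest neighbour: greedily keep only the prototypes
    needed so every training sample is classified correctly by its 1-NN
    within the kept subset."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    keep = [order[0]]                      # start from one random prototype
    changed = True
    while changed:
        changed = False
        for i in order:
            dists = np.linalg.norm(X[keep] - X[i], axis=1)
            nearest = keep[int(np.argmin(dists))]
            if y[nearest] != y[i]:         # misclassified -> absorb as prototype
                keep.append(i)
                changed = True
    return np.array(keep)

# Tiny usage example on synthetic two-class data (illustrative only).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
idx = condensed_nn(X, y)
print(f"kept {len(idx)} of {len(X)} training samples")
```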


Feature scaling in support vector data description

When only samples of one class are easily accessible in a classification problem, it is called a one-class classification problem. Many standard classifiers, like backpropagation neural networks, fail on such data. Some other classifiers, like k-means clustering or the nearest neighbor classifier, can be applied after some minor changes. In this paper we focus on the support vector data de...
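
scikit-learn does not ship SVDD itself, but its OneClassSVM with an RBF kernel is a closely related one-class model, so a hedged sketch of one-class training with feature scaling might look like the following; the synthetic data, the nu value, and the use of StandardScaler are assumptions for illustration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

# Train only on samples of the target class (one-class setting),
# with deliberately unequal feature scales to motivate scaling.
rng = np.random.default_rng(0)
X_target = rng.normal(loc=0.0, scale=[1.0, 50.0], size=(200, 2))
X_outlier = rng.uniform(low=-5, high=5, size=(50, 2)) * [1.0, 50.0]

model = make_pipeline(StandardScaler(),
                      OneClassSVM(kernel="rbf", nu=0.05, gamma="scale"))
model.fit(X_target)

# +1 = accepted as target class, -1 = rejected as outlier.
print("target acceptance rate :", (model.predict(X_target) == 1).mean())
print("outlier rejection rate :", (model.predict(X_outlier) == -1).mean())
```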


A fast nearest neighbor classifier based on self-organizing incremental neural network

A fast prototype-based nearest neighbor classifier is introduced. The proposed Adjusted SOINN Classifier (ASC) is based on SOINN (self-organizing incremental neural network); it automatically learns the number of prototypes needed to determine the decision boundary, and it learns new information without destroying previously learned information. It is robust to noisy training data, and it realizes very f...
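
A full SOINN implementation is well beyond a short example, but the flavour of an incremental prototype-based classifier can be suggested with the toy learner below (add a prototype on misclassification, otherwise nudge the nearest same-class prototype); this is explicitly not the ASC algorithm, and every name and parameter here is an assumption.

```python
import numpy as np

class ToyIncrementalPrototypes:
    """Toy incremental prototype learner (NOT ASC/SOINN): a misclassified
    sample becomes a new prototype, otherwise the nearest prototype of the
    same class is nudged toward the sample."""

    def __init__(self, lr=0.05):
        self.lr = lr
        self.protos, self.labels = [], []

    def partial_fit(self, x, label):
        if not self.protos:
            self.protos.append(x.copy()); self.labels.append(label)
            return
        d = np.linalg.norm(np.array(self.protos) - x, axis=1)
        j = int(np.argmin(d))
        if self.labels[j] != label:        # error -> absorb as new prototype
            self.protos.append(x.copy()); self.labels.append(label)
        else:                              # correct -> adjust existing prototype
            self.protos[j] += self.lr * (x - self.protos[j])

    def predict(self, x):
        d = np.linalg.norm(np.array(self.protos) - x, axis=1)
        return self.labels[int(np.argmin(d))]

# Usage: stream samples one at a time, then classify a query point.
model = ToyIncrementalPrototypes()
for x, label in zip(np.array([[0., 0.], [1., 1.], [0.1, 0.2]]), [0, 1, 0]):
    model.partial_fit(x, label)
print(model.predict(np.array([0.9, 1.1])))   # likely class 1
```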



Journal title:

Volume   Issue

Pages   -

Publication date: 2007